Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding

Neural Information Processing Systems

Knowledge graph (KG) embedding is a well-established approach to learning representations of KGs. Many models have been proposed to learn the interactions between the entities and relations within individual triplets. However, long-term information spanning multiple triplets is also important to KGs. In this work, building on relational paths, which are composed of sequences of triplets, we define the Interstellar as a recurrent neural architecture search problem for capturing short-term and long-term information along the paths. First, we analyze the difficulty of using a single unified model as the Interstellar. Then, we propose to search for a recurrent architecture to serve as the Interstellar for different KG tasks. A case study on synthetic data illustrates the importance of the defined search problem, and experiments on real datasets demonstrate the effectiveness of the searched models and the efficiency of the proposed hybrid-search algorithm.
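To make the setting concrete, the following is a minimal sketch (not the paper's actual Interstellar cell) of a recurrent step along a relational path (e0, r1, e1, r2, e2, ...). The `combinator` and `activation` choices stand in for the kinds of searchable components such an architecture search would pick among; all names and the specific update rule are illustrative assumptions.

```python
# Hypothetical sketch of recurrent encoding over a relational path.
# NOT the paper's method: the combinator ("mult" vs "add") and activation
# ("tanh" vs "identity") are stand-ins for searchable architecture choices.
import numpy as np

def path_recurrent(entities, relations, combinator="mult", activation="tanh"):
    """Encode a relational path e0 -r1-> e1 -r2-> ... into one vector.

    entities:  list of (d,) arrays, length n+1
    relations: list of (d,) arrays, length n
    """
    h = entities[0]
    for r, e in zip(relations, entities[1:]):
        # short-term interaction between the hidden state and the relation
        step = h * r if combinator == "mult" else h + r
        # fold in the next entity, carrying long-term path information forward
        step = step + e
        h = np.tanh(step) if activation == "tanh" else step
    return h

rng = np.random.default_rng(0)
d = 4
ents = [rng.standard_normal(d) for _ in range(3)]  # e0, e1, e2
rels = [rng.standard_normal(d) for _ in range(2)]  # r1, r2
vec = path_recurrent(ents, rels)
print(vec.shape)  # (4,)
```

Searching over such discrete choices per step is what makes this a recurrent architecture search problem rather than a single fixed model.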


Review for NeurIPS paper: Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding

Neural Information Processing Systems

- Strong and clearly stated motivation for defining the "path Interstellar". By comparing the learning ability of triplet-based, path-based, and GCN-based methods, the path Interstellar (Definition 1) is proposed as the basic model for learning from KGs. This motivation is also verified by a case study on synthetic data (experiments in Section 4.2).
- Domain-specific and well-defined search space. The authors propose a novel recurrent search space specific to the path learning problem. The searched components are motivated either by models in the literature (combinators, activations) or by the learning problem itself (connections).


Review for NeurIPS paper: Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding

Neural Information Processing Systems

The paper describes a recurrent neural architecture search technique to leverage path information in knowledge graphs. This is an important contribution both in terms of architecture design and in terms of improving the state of the art.

